Spherical Structured Feature Maps for Kernel Approximation
Abstract
We propose Spherical Structured Feature (SSF) maps to approximate shift- and rotation-invariant kernels as well as b-order arc-cosine kernels (Cho & Saul, 2009). We construct SSF maps from a point set on the (d−1)-dimensional sphere S^{d−1}. We prove that the inner product of SSF maps is an unbiased estimate of the above kernels whenever the point set on S^{d−1} is asymptotically uniformly distributed. According to (Brauchart & Grabner, 2015), optimizing the discrete Riesz s-energy generates such asymptotically uniformly distributed point sets on S^{d−1}. We therefore propose an efficient coordinate descent method that finds a local optimum of the discrete Riesz s-energy for SSF map construction. Theoretically, SSF map construction achieves linear space complexity and log-linear time complexity. Empirically, SSF maps achieve superior performance compared with other methods.
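The abstract does not spell out the coordinate descent update, so the following is a minimal illustrative sketch: it minimizes the discrete Riesz s-energy E_s(X) = Σ_{i≠j} ||x_i − x_j||^{−s} over N points on S^{d−1} using a block-coordinate (one point at a time) step that moves against the gradient and projects back onto the sphere. The function names, step-size schedule, and stopping rule are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def riesz_energy(X, s=1.0):
    """Discrete Riesz s-energy: sum over i != j of 1 / ||x_i - x_j||^s."""
    diff = X[:, None, :] - X[None, :, :]           # (N, N, d) pairwise differences
    dist = np.linalg.norm(diff, axis=-1)           # (N, N) pairwise distances
    np.fill_diagonal(dist, np.inf)                 # exclude the i == j terms
    return np.sum(dist ** (-s))

def minimize_riesz_energy(N, d, s=1.0, n_sweeps=50, step=0.1, seed=0):
    """Illustrative block-coordinate descent on S^{d-1}: update one point at a
    time by stepping against its energy gradient and re-projecting onto the sphere."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((N, d))
    X /= np.linalg.norm(X, axis=1, keepdims=True)  # start from random points on the sphere
    for t in range(n_sweeps):
        lr = step / (1.0 + t)                      # simple decaying step size
        for i in range(N):
            diff = X[i] - np.delete(X, i, axis=0)                  # (N-1, d)
            dist = np.linalg.norm(diff, axis=1, keepdims=True)     # (N-1, 1)
            # gradient of sum_j ||x_i - x_j||^{-s} with respect to x_i
            grad = (-s * diff / dist ** (s + 2)).sum(axis=0)
            X[i] -= lr * grad / (np.linalg.norm(grad) + 1e-12)     # normalized descent step
            X[i] /= np.linalg.norm(X[i])                           # project back onto S^{d-1}
    return X

if __name__ == "__main__":
    X = minimize_riesz_energy(N=64, d=3, s=1.0)
    print("final Riesz 1-energy:", riesz_energy(X, s=1.0))
```

The resulting low-energy point set is asymptotically uniformly distributed on S^{d−1} and would serve as the directions from which the SSF feature map is built; the exact map construction follows the paper and is not reproduced here.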
Related Papers
Spherical Random Features for Polynomial Kernels
Compact explicit feature maps provide a practical framework to scale kernel methods to large-scale learning, but deriving such maps for many types of kernels remains a challenging open problem. Among the commonly used kernels for nonlinear classification are polynomial kernels, for which low approximation error has thus far necessitated explicit feature maps of large dimensionality, especially ...
Compact Random Feature Maps
Kernel approximation using random feature maps has recently gained a lot of interest. This is mainly due to their applications in reducing training and testing times of kernel-based learning algorithms. In this work, we identify that previous approaches for polynomial kernel approximation create maps that can be rank deficient, and therefore may not utilize the capacity of the projected feature...
Spherical-Homoscedastic Distributions: The Equivalency of Spherical and Normal Distributions in Classification
Many feature representations, as in genomics, describe directional data where all feature vectors share a common norm. In other cases, as in computer vision, a norm or variance normalization step, in which all feature vectors are normalized to a common length, is generally used. These representations and pre-processing steps map the original data from R^p to the surface of a hypersphere S^{p−1}. Such r...
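As a concrete illustration of the normalization step mentioned above, the snippet below projects each row of a data matrix from R^p onto the unit hypersphere S^{p−1} by dividing it by its Euclidean norm; the helper name normalize_to_sphere and the eps guard are our own additions.

```python
import numpy as np

def normalize_to_sphere(X, eps=1e-12):
    """Map each row of X from R^p onto the unit hypersphere S^{p-1}
    by dividing it by its Euclidean norm (eps guards against zero rows)."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    return X / np.maximum(norms, eps)

# Example: five random vectors in R^4 projected onto S^3
X = np.random.default_rng(0).standard_normal((5, 4))
print(np.linalg.norm(normalize_to_sphere(X), axis=1))  # all ones (up to rounding)
```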
A Unifying View of Explicit and Implicit Feature Maps for Structured Data: Systematic Studies of Graph Kernels
Non-linear kernel methods can be approximated by fast linear ones using suitable explicit feature maps, allowing their application to large-scale problems. To this end, explicit feature maps of kernels for vectorial data have been extensively studied. As much real-world data is structured, various kernels for complex data like graphs have been proposed. Indeed, many of them directly compute feat...